Exploiting Determinism to Scale Relational Inference
Abstract
One key challenge in statistical relational learning (SRL) is scalable inference. Unfortunately, most real-world problems in SRL have expressive models that translate into large grounded networks, which are a bottleneck for any inference method and weaken its scalability. In this paper we introduce Preference Relaxation (PR), a two-stage strategy that uses the determinism present in the underlying model to improve the scalability of relational inference. The basic idea of PR is that if the underlying model involves mandatory (i.e., hard) constraints as well as preferences (i.e., soft constraints), then it is potentially wasteful to allocate memory for all constraints in advance when performing inference. To avoid this, PR starts by relaxing preferences and performing inference with hard constraints only. It then removes variables that violate hard constraints, thereby avoiding irrelevant computations involving preferences. In addition, it uses the removed variables to enlarge the evidence database. This reduces the effective size of the grounded network. Our approach is general and can be applied to various inference methods in relational domains. Experiments on real-world applications show how PR substantially scales relational inference with a minor impact on accuracy.
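The two-stage strategy described in the abstract can be illustrated with a short sketch. The code below is only a minimal illustration, assuming a hypothetical model interface (ground, is_determined) and a pluggable infer routine; it is not the authors' implementation or the API of any particular SRL system.

```python
# Illustrative sketch of the two-stage Preference Relaxation (PR) strategy
# described above. The model/ground/infer interface is hypothetical and stands
# in for whatever SRL engine (e.g. a Markov logic system) is actually used.

def preference_relaxation(model, evidence, infer):
    """Two-stage relational inference: hard constraints first, then preferences.

    model    -- relational model exposing ground(constraints=..., evidence=...)
                (assumed interface)
    evidence -- dict mapping ground atoms to observed truth values
    infer    -- any inference routine over a grounded network, returning a
                dict of variable assignments
    """
    # Stage 1: relax the preferences (soft constraints) and run inference
    # over the grounding of the mandatory (hard) constraints only.
    hard_net = model.ground(constraints="hard", evidence=evidence)
    hard_result = infer(hard_net)

    # Variables whose values are fixed by the hard constraints (e.g. atoms
    # that would otherwise violate them) are removed from the query and
    # folded into the evidence database instead.
    determined = {var: val for var, val in hard_result.items()
                  if hard_net.is_determined(var)}
    enlarged_evidence = {**evidence, **determined}

    # Stage 2: ground both hard and soft constraints against the enlarged
    # evidence; the effective grounded network is smaller, so no preference
    # computations are wasted on variables already fixed in stage 1.
    full_net = model.ground(constraints="all", evidence=enlarged_evidence)
    return infer(full_net)
```

Because both stages delegate to the same generic infer routine, the sketch mirrors the abstract's claim that the approach is independent of the underlying inference method.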
Similar Resources
Efficient Probabilistic Inference for Dynamic Relational Models
Over the last couple of years, the interest in combining probability and logic has grown strongly. This led to the development of different software packages like PRISM, ProbLog and Alchemy, which offer a variety of exact and approximate algorithms to perform inference and learning. What is lacking, however, are algorithms to perform efficient inference in relational temporal models by systemat...
Searchable Encrypted Relational Databases: Risks and Countermeasures
We point out the risks of protecting relational databases via Searchable Symmetric Encryption (SSE) schemes by proposing an inference attack exploiting the structural properties of relational databases. We show that record-injection attacks mounted on relational databases have worse consequences than their file-injection counterparts on unstructured databases. Moreover, we discuss some techniqu...
Learning and Exploiting Relational Structure for Efficient Inference
Aniruddh Nath (Supervisory Committee Chair: Professor Pedro Domingos, Computer Science & Engineering). One of the central challenges of statistical relational learning is the tradeoff between expressiveness and computational tractability. Representations such as Markov logic can capture rich joint probabilistic models over ...
Lifting Relational MAP-LPs using Cluster Signatures
Inference in large-scale graphical models is an important task in many domains, and in particular probabilistic relational models (e.g. Markov logic networks). Such models often exhibit considerable symmetry, and it is a challenge to devise algorithms that exploit this symmetry to speed up inference. Recently, the automorphism group has been proposed to formalize mathematically what "exploiting...
Graphical models and symmetries: loopy belief propagation approaches
Whenever a person or an automated system has to reason in uncertain domains, probability theory is necessary. Probabilistic graphical models allow us to build statistical models that capture complex dependencies between random variables. Inference in these models, however, can easily become intractable. Typical ways to address this scaling issue are inference by approximate message-passing, sto...